Hybrid conjugate gradient-BFGS methods based on Wolfe line search
Authors
Abstract
"In this paper, we present some hybrid methods for solving unconstrained optimization problems. These are defined using proper combinations of the search directions and the parameters involved in the conjugate gradient and quasi-Newton Broyden-Fletcher-Goldfarb-Shanno (CG-BFGS) methods. Their global convergence under the Wolfe line search is analyzed for general objective functions. Numerical experiments show the superiority of the modified CG-BFGS method with respect to existing methods."
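The abstract does not spell out how the CG and BFGS directions are combined, so the following is only a minimal sketch of the general idea: blend a conjugate-gradient direction with a BFGS quasi-Newton direction and choose step lengths by a Wolfe line search. The function names `hybrid_cg_bfgs` and `wolfe_line_search`, the blending weight `theta`, and the PRP choice of the CG coefficient are illustrative assumptions, not the authors' method.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection-style search for a step satisfying the weak Wolfe conditions."""
    alpha, lo, hi = 1.0, 0.0, np.inf
    fx, slope = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * slope:   # Armijo fails: shrink
            hi = alpha
        elif grad(x + alpha * d) @ d < c2 * slope:       # curvature fails: grow
            lo = alpha
        else:
            return alpha
        alpha = (lo + hi) / 2 if hi < np.inf else 2 * lo
    return alpha

def hybrid_cg_bfgs(f, grad, x0, theta=0.5, tol=1e-8, max_iter=200):
    """Hypothetical hybrid: d_k = theta * d_CG + (1 - theta) * d_BFGS."""
    x = np.asarray(x0, float)
    n = x.size
    H = np.eye(n)                      # inverse-Hessian approximation
    g = grad(x)
    d = -g                             # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                 # BFGS update only under positive curvature
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        # PRP+ conjugate-gradient coefficient (one common choice; the paper's may differ)
        beta = max(0.0, (g_new @ y) / (g @ g))
        d_cg = -g_new + beta * d
        d_bfgs = -H @ g_new
        d = theta * d_cg + (1 - theta) * d_bfgs
        if d @ g_new >= 0:             # safeguard: fall back to steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a small convex quadratic this sketch drives the gradient norm to (near) zero in a handful of iterations; the safeguard keeps every search direction a descent direction so the Wolfe conditions remain satisfiable.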
Similar resources
Nonlinear Conjugate Gradient Methods with Wolfe Type Line Search
… = ‖d_{k−1}‖²/‖g_{k−1}‖⁴ + 1/‖g_k‖² − β_k² (g_kᵀ d_{k−1})²/‖g_k‖⁴ …
New hybrid conjugate gradient methods with the generalized Wolfe line search
The conjugate gradient method is an efficient technique for solving unconstrained optimization problems. In this paper, we form a linear combination, with parameter β_k, of the DY method and the HS method, and put forward a hybrid of DY and HS. We also propose a hybrid of FR and PRP in the same way. Additionally, to support the two hybrid methods, we generalize the Wolfe line s...
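The snippet above describes blending the DY and HS conjugate-gradient parameters, which conveniently share the denominator d_{k−1}ᵀ y_k. A minimal sketch of such a convex combination follows; the function name `hybrid_beta` and the weight `lam` are illustrative assumptions, and the paper's exact weighting may differ.

```python
import numpy as np

def hybrid_beta(g_new, g_old, d_old, lam=0.5):
    """Hypothetical convex combination of the Dai-Yuan (DY) and
    Hestenes-Stiefel (HS) conjugate-gradient parameters."""
    y = g_new - g_old            # gradient difference y_k
    denom = d_old @ y            # shared DY/HS denominator d_{k-1}^T y_k
    beta_dy = (g_new @ g_new) / denom
    beta_hs = (g_new @ y) / denom
    return lam * beta_dy + (1 - lam) * beta_hs

# The next search direction would then be d_k = -g_k + beta_k * d_{k-1}.
```

With lam = 1 this reduces to pure DY, and with lam = 0 to pure HS.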
Global Convergence of Conjugate Gradient Methods without Line Search
Global convergence results are derived for well-known conjugate gradient methods in which the line search step is replaced by a step whose length is determined by a formula. The results include the following cases: 1. The Fletcher-Reeves method, the Hestenes-Stiefel method, and the Dai-Yuan method applied to a strongly convex LC¹ objective function; 2. The Polak-Ribière method and the Conjugate ...
A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
In this paper, we seek the conjugate gradient direction closest to the direction of the scaled memoryless BFGS method and propose a family of conjugate gradient methods for unconstrained optimization. An improved Wolfe line search is also proposed, which can avoid a numerical drawback of the Wolfe line search and guarantee the global convergence of the conjugate gradient method under mild condi...
A Conjugate Gradient Method with Strong Wolfe-Powell Line Search for Unconstrained Optimization
In this paper, a modified conjugate gradient method is presented for solving large-scale unconstrained optimization problems, which possesses the sufficient descent property with the strong Wolfe-Powell (SWP) line search. A global convergence result is proved when the SWP line search is used under some conditions. Computational results for a set consisting of 138 unconstrained optimization test probl...
Journal
Journal title: Studia Universitatis Babeş-Bolyai
Year: 2022
ISSN: 1224-8754, 2065-9458
DOI: https://doi.org/10.24193/subbmath.2022.4.14